tokenization_bart.py: return_tensors default should be "pt" #8556
Mehrad0711 wants to merge 1 commit into huggingface:master from
Conversation
Tokenizers are supposed to be framework (PyTorch/TensorFlow/FLAX) agnostic, so we probably don't want to go in that direction.
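A minimal sketch of the framework-agnostic usage, assuming the facebook/bart-large checkpoint and the prepare_seq2seq_batch API from the transformers v3.x/v4.0 era: the caller picks the tensor type explicitly instead of relying on the tokenizer's default.

```python
# A minimal sketch, assuming the facebook/bart-large checkpoint and the
# prepare_seq2seq_batch API from the transformers v3.x/v4.0 era.
from transformers import BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
src_texts = ["UN Chief Says There Is No Military Solution in Syria"]

# Passing return_tensors explicitly keeps the call framework agnostic:
# "pt" for PyTorch tensors, "tf" for TensorFlow, or None for plain Python lists.
pt_batch = tokenizer.prepare_seq2seq_batch(src_texts, return_tensors="pt")
list_batch = tokenizer.prepare_seq2seq_batch(src_texts, return_tensors=None)
```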
Gotcha. Is this going to be the case for all tokenizers in the future? Because currently they default to PyTorch except for Bart's.
The fact that the BART-like tokenizers have
We will have to update this, which will be a breaking change, so we'll try to put it in v4.0.0. Do you want to open a PR to fix the issue?
Sure. I'll close this then and make a new PR for that.
Hi @Mehrad0711! We're rushing to If you have, you can push your fixes and open a PR and I'll incorporate those changes in my PR and mark you as co-author.
No problem @LysandreJik. Thanks for fixing it!
return_tensors default should be "pt" in BART's prepare_seq2seq_batch.
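A hypothetical sketch of the kind of default this PR proposes; the parameters other than return_tensors are assumptions and the actual signature in tokenization_bart.py may differ.

```python
# Hypothetical sketch of the proposed default; parameter names other than
# return_tensors are assumptions, not the exact signature in tokenization_bart.py.
def prepare_seq2seq_batch(
    self,
    src_texts,
    tgt_texts=None,
    max_length=None,
    max_target_length=None,
    padding="longest",
    return_tensors="pt",  # default proposed here, so batches come back as PyTorch tensors
    truncation=True,
    **kwargs,
):
    ...
```

As discussed in the conversation above, the maintainers preferred keeping tokenizers framework agnostic, so the change eventually made for v4.0.0 may differ from this sketch.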